Recursion-Free Online Multiple Incremental/Decremental Analysis Based on Ridge Support Vector Learning
Abstract
This study presents a rapid multiple incremental and decremental mechanism for support-vector analysis based on Weight-Error Curves (WECs). To handle rapidly increasing amounts of data, a recursion-free computation is proposed for predicting the Lagrangian multipliers of new samples. The study examines the characteristics of Ridge Support Vector Models, including Ridge Support Vector Machines and Ridge Support Vector Regression, and derives a recursion-free function from the WECs. With this function, all of the new Lagrangian multipliers can be computed at once, without any gradual step sizes. The function also relaxes a constraint of previous work, which required the increments of the new Lagrangian multipliers to be identical, so the Karush–Kuhn–Tucker (KKT) conditions are satisfied more easily. Consequently, the proposed mechanism no longer requires the typical time-consuming bookkeeping strategies that compute a step size by checking all training samples in each incremental round. Experiments on open datasets showed that the proposed method ran faster than the baselines while maintaining comparable accuracy, indicating that it is well suited to incremental/decremental learning.
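The paper's exact WEC-based formula is not reproduced on this page, but the headline idea, computing all new Lagrangian multipliers in one shot with no step sizes and no bookkeeping, can be illustrated on a ridge-style dual. The Python sketch below (illustrative names, assuming a cached inverse of K + lam*I) adds a whole batch of samples to a kernel ridge model through a block-inverse (Schur complement) update, so every new dual coefficient appears at once:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between row-sample arrays A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def batch_increment(X_old, y_old, M_inv, X_new, y_new, lam=1.0, gamma=0.5):
    """Add a whole batch of samples to a kernel ridge model at once.

    M_inv caches (K_old + lam*I)^{-1}.  A Schur-complement block update
    yields the inverse of the augmented matrix, so all dual coefficients
    (the ridge analogue of Lagrangian multipliers) are recomputed in one
    shot, with no per-sample recursion and no step-size search.
    """
    K_ob = rbf_kernel(X_old, X_new, gamma)              # old-new cross block
    K_bb = rbf_kernel(X_new, X_new, gamma) + lam * np.eye(len(X_new))
    T = M_inv @ K_ob
    S = K_bb - K_ob.T @ T                               # Schur complement
    S_inv = np.linalg.inv(S)
    top = np.hstack([M_inv + T @ S_inv @ T.T, -T @ S_inv])
    bot = np.hstack([-S_inv @ T.T, S_inv])
    M_inv_new = np.vstack([top, bot])                   # augmented inverse
    X = np.vstack([X_old, X_new])
    y = np.concatenate([y_old, y_new])
    alpha = M_inv_new @ y                               # all multipliers at once
    return X, y, M_inv_new, alpha
```

For a batch of m new samples the update cost is dominated by inverting the m-by-m Schur complement, which is far cheaper than refactorizing the full augmented matrix.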
Similar Resources
Incremental and Decremental Support Vector Machine Learning
An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn–Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance...
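The adiabatic SVM procedure summarized above is intricate; for the ridge-style duals discussed elsewhere on this page, the same "decremental unlearning gives exact leave-one-out" idea collapses to a closed form. The sketch below shows that ridge analogue (explicitly not the cited SVM algorithm): with M = K + lam*I and alpha = inv(M) @ y, removing sample i and re-predicting it yields the residual alpha[i] / inv(M)[i, i] exactly.

```python
import numpy as np

def ridge_loo_residuals(K, y, lam=1.0):
    """Exact leave-one-out residuals for kernel ridge regression.

    With M = K + lam*I and alpha = inv(M) @ y, the prediction error on
    sample i after 'unlearning' it is exactly alpha[i] / inv(M)[i, i],
    so leave-one-out evaluation needs no retraining at all.
    """
    M_inv = np.linalg.inv(K + lam * np.eye(len(y)))
    alpha = M_inv @ y
    return alpha / np.diag(M_inv)
```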
Incremental and Decremental Proximal Support Vector Classification using Decay Coefficients
This paper presents an efficient approach to decremental learning for incremental proximal support vector machines (SVMs). The presented decremental algorithm, based on decay coefficients, is compared with an existing window-based decremental algorithm and is shown to achieve similar accuracy while providing significantly better computational performance.
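As a rough sketch of the decay-coefficient idea (not the paper's exact proximal SVM update; all names are illustrative), each sample below is down-weighted by delta**age in a ridge-style fit, so stale samples fade out smoothly instead of being dropped by a hard sliding window:

```python
import numpy as np

def decayed_ridge_fit(X, y, ages, delta=0.95, lam=1.0):
    """Ridge-style proximal fit with exponential decay coefficients.

    Each sample i receives weight delta**ages[i]; older samples thus
    lose influence gradually, replacing window-based hard removal.
    """
    w = delta ** np.asarray(ages, dtype=float)   # decay coefficients
    Xw = X * w[:, None]                          # weighted design matrix
    A = X.T @ Xw + lam * np.eye(X.shape[1])      # weighted normal equations
    return np.linalg.solve(A, Xw.T @ y)
```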
Introducing evolving Takagi-Sugeno method based on local least squares support vector machine models
In this study, an efficient local online identification method based on the evolving Takagi–Sugeno least squares support vector machine (eTS-LS-SVM) for nonlinear time series prediction is introduced. As an innovation, this paper applies nonlinear models, i.e. local LS-SVM models, as the consequent parts of the fuzzy rules, instead of the linear models used in the conventional evolving Takagi–Sugeno...
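A minimal sketch of the Takagi-Sugeno blending step may clarify the rule structure: Gaussian memberships over rule centers gate a set of local predictors (plain callables here, standing in for fitted local LS-SVM consequents), and the output is their normalized weighted sum. The online rule-evolution machinery of eTS-LS-SVM is omitted.

```python
import numpy as np

def ts_predict(x, centers, widths, local_models):
    """Takagi-Sugeno prediction: membership-weighted local models.

    centers/widths define Gaussian rule antecedents; local_models are
    callables (e.g., fitted local LS-SVM predictors) whose outputs are
    blended by the normalized firing levels.
    """
    mu = np.array([np.exp(-np.sum((x - c) ** 2) / (2.0 * s ** 2))
                   for c, s in zip(centers, widths)])
    mu /= mu.sum()                               # normalized firing levels
    return sum(m * f(x) for m, f in zip(mu, local_models))
```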
Implementation Issues of an Incremental and Decremental SVM
The incremental and decremental processes of training a support vector machine (SVM) reduce to the migration of vectors into and out of the support set, along with modification of the associated thresholds. This paper gives an overview of all the boundary conditions implied by vector migration through the incremental/decremental process. The analysis will show that the same procedures, with very slight v...
Efficient multiple incremental computation for Kernel Ridge Regression with Bayesian uncertainty modeling
Abstract—This study presents an energy-economic approach to incremental/decremental learning based on kernel ridge regression, a frequently used regressor on clouds. To avoid reanalyzing the entire dataset whenever the data change, the proposed mechanism supports incremental/decremental processing for both single and multiple samples (i.e., batch processing). Moreover, incremental/decremental...
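Complementing the batch-incremental sketch after the main abstract, batch decremental learning for a kernel ridge model can likewise avoid refactorization. The sketch below (illustrative names, assuming a cached inverse M_inv of K + lam*I) downdates the inverse when a block of samples is forgotten, using the block-inverse identity P^{-1} = A - B D^{-1} B^T:

```python
import numpy as np

def batch_decrement(M_inv, y, drop):
    """Forget a batch of samples without refactorizing.

    Partition the cached inverse of (K + lam*I) as [[A, B], [B.T, D]],
    with D over the dropped samples; the retained block's inverse is
    then A - B @ inv(D) @ B.T, a standard block-inverse identity.
    """
    keep = np.setdiff1d(np.arange(len(y)), drop)
    A = M_inv[np.ix_(keep, keep)]
    B = M_inv[np.ix_(keep, drop)]
    D = M_inv[np.ix_(drop, drop)]
    M_inv_small = A - B @ np.linalg.solve(D, B.T)
    alpha = M_inv_small @ y[keep]                # updated multipliers
    return M_inv_small, alpha
```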
Publication date: 2016